LCDNet: Deep Loop Closure Detection and Point Cloud Registration for LiDAR SLAM

Authors

Abstract

Loop closure detection is an essential component of simultaneous localization and mapping (SLAM) systems, which reduces the drift accumulated over time. Over the years, several deep learning approaches have been proposed to address this task; however, their performance has been subpar compared to handcrafted techniques, especially while dealing with reverse loops. In this article, we introduce the novel loop closure detection network (LCDNet) that effectively detects loop closures in light detection and ranging (LiDAR) point clouds by simultaneously identifying previously visited places and estimating the six degrees of freedom relative transformation between the current scan and the map. LCDNet is composed of a shared encoder, a place recognition head that extracts global descriptors, and a relative pose head that estimates the transformation between two point clouds. The relative pose head is based on the unbalanced optimal transport theory, which we implement in a differentiable manner to allow for end-to-end training. Extensive evaluations on multiple real-world autonomous driving datasets show that our approach outperforms state-of-the-art loop closure detection and point cloud registration techniques by a large margin. Moreover, we integrate our approach into a LiDAR SLAM library to provide a complete mapping system and demonstrate its generalization ability using a different sensor setup in an unseen city.
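To make the architecture described in the abstract concrete, here is a minimal PyTorch sketch of the three components: a shared encoder, a place recognition head that pools per-point features into a global descriptor, and a relative pose head that soft-matches features with a differentiable Sinkhorn iteration (a dustbin-based relaxation in the spirit of unbalanced optimal transport) and recovers the 6-DoF transform with a weighted Kabsch solver. All names, the toy PointNet-style backbone, and the simplifications below are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch only: a toy stand-in for the LCDNet design in the
# abstract (shared encoder, place recognition head, OT-based pose head).
import torch
import torch.nn as nn


def sinkhorn_with_dustbin(scores, n_iters=20):
    """Differentiable matching: pad the score matrix with a dustbin
    row/column so unmatched points can discard their mass (a simple
    unbalanced-transport relaxation), then alternate row/column
    normalization in log space."""
    b, m, n = scores.shape
    log_p = scores.new_zeros(b, m + 1, n + 1)
    log_p[:, :m, :n] = scores
    for _ in range(n_iters):
        log_p = log_p - log_p.logsumexp(dim=2, keepdim=True)
        log_p = log_p - log_p.logsumexp(dim=1, keepdim=True)
    return log_p[:, :m, :n].exp()                 # soft correspondences


def weighted_kabsch(src, tgt, w):
    """Closed-form weighted rigid alignment (SVD/Kabsch). Differentiable,
    so the predicted transform can be supervised end to end."""
    w = (w / w.sum(dim=1, keepdim=True).clamp_min(1e-8)).unsqueeze(-1)
    src_c = (w * src).sum(dim=1, keepdim=True)    # weighted centroids
    tgt_c = (w * tgt).sum(dim=1, keepdim=True)
    H = ((src - src_c) * w).transpose(1, 2) @ (tgt - tgt_c)
    U, _, Vt = torch.linalg.svd(H)
    V, Ut = Vt.transpose(1, 2), U.transpose(1, 2)
    d = torch.det(V @ Ut)                         # fix reflections
    D = torch.diag_embed(torch.stack([torch.ones_like(d),
                                      torch.ones_like(d), d], dim=-1))
    R = V @ D @ Ut
    t = tgt_c.squeeze(1) - (R @ src_c.transpose(1, 2)).squeeze(-1)
    return R, t


class ToyLCDNet(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        # Shared encoder: per-point features feeding both heads.
        self.encoder = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                                     nn.Linear(64, feat_dim))

    def forward(self, src, tgt):                  # (B, N, 3) each
        f1, f2 = self.encoder(src), self.encoder(tgt)
        # Place recognition head: pool to a global descriptor.
        g1 = nn.functional.normalize(f1.max(dim=1).values, dim=1)
        g2 = nn.functional.normalize(f2.max(dim=1).values, dim=1)
        # Relative pose head: soft matching + differentiable alignment.
        P = sinkhorn_with_dustbin(f1 @ f2.transpose(1, 2)
                                  / f1.shape[-1] ** 0.5)
        w = P.sum(dim=2)                          # per-point match mass
        soft_tgt = (P @ tgt) / w.unsqueeze(-1).clamp_min(1e-8)
        R, t = weighted_kabsch(src, soft_tgt, w)
        return g1, g2, R, t
```

In such a setup, a loop closure would be declared when the distance between the global descriptors falls below a threshold, with the predicted (R, t) serving as the initial alignment for registration.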



Related Articles

Hierarchical Registration Method for Airborne and Vehicle LiDAR Point Cloud

A new hierarchical method for the automatic registration of airborne and vehicle light detection and ranging (LiDAR) data is proposed, using three-dimensional (3D) road networks and 3D building contours. Firstly, 3D road networks are extracted from airborne LiDAR data and then registered with vehicle trajectory lines. During the registration of airborne road networks and vehicle trajectory line...


Synchronous Adversarial Feature Learning for LiDAR based Loop Closure Detection

Loop Closure Detection (LCD) is an essential module in the simultaneous localization and mapping (SLAM) task. In current appearance-based SLAM methods, the visual inputs are usually affected by illumination, appearance, and viewpoint changes. Compared to visual inputs, light detection and ranging (LiDAR) based point-cloud inputs, owing to their active sensing property, are invariant to the illuminat...


Fusing Bird View LIDAR Point Cloud and Front View Camera Image for Deep Object Detection

We propose a new method for fusing a LIDAR point cloud and camera-captured images in a deep convolutional neural network (CNN). The proposed method constructs a new layer, called the non-homogeneous pooling layer, to transform features between the bird's-eye-view map and the front-view map. The sparse LIDAR point cloud is used to construct the mapping between the two maps. The pooling layer allows efficient fusi...
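The point-mediated view transform described here can be illustrated with a toy sketch: each LiDAR point is projected into both the front-view image and the bird's-eye-view (BEV) grid, and the point correspondences carry features from one map to the other. The pinhole model, grid parameters, and function name below are assumptions for illustration, not the paper's actual layer.

```python
import numpy as np


def front_to_bev(points, img_feat, K, bev_shape=(200, 200), cell=0.5):
    """points: (N, 3) in camera coords (x right, y down, z forward);
    img_feat: (H, W, C) front-view feature map; K: 3x3 intrinsics."""
    H, W, C = img_feat.shape
    pts = points[points[:, 2] > 0]                # keep points in front
    uv = (K @ pts.T).T
    u = (uv[:, 0] / uv[:, 2]).astype(int)         # pinhole projection
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    bx = (pts[:, 0] / cell + bev_shape[1] / 2).astype(int)  # BEV column
    bz = (pts[:, 2] / cell).astype(int)                     # BEV row
    ok = ((u >= 0) & (u < W) & (v >= 0) & (v < H) &
          (bx >= 0) & (bx < bev_shape[1]) &
          (bz >= 0) & (bz < bev_shape[0]))
    bev = np.zeros((*bev_shape, C))
    cnt = np.zeros(bev_shape)
    # Each point carries the image feature at its pixel into its BEV
    # cell; average where several points land in the same cell.
    np.add.at(bev, (bz[ok], bx[ok]), img_feat[v[ok], u[ok]])
    np.add.at(cnt, (bz[ok], bx[ok]), 1)
    return bev / np.maximum(cnt[..., None], 1)
```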


Improving 3d Lidar Point Cloud Registration Using Optimal Neighborhood Knowledge

Automatic 3D point cloud registration is a central issue in computer vision and photogrammetry. The most commonly adopted solution is the well-known ICP (Iterative Closest Point) algorithm. This standard approach performs a fine registration of two overlapping point clouds by iteratively estimating the transformation parameters, assuming that a good a priori alignment is provided. A large body o...
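For reference, here is a compact sketch of the vanilla point-to-point ICP the paragraph refers to: starting from a rough initial alignment, it alternates nearest-neighbour correspondence search with a closed-form SVD (Kabsch) update of the rigid transform until the error stops improving. The function name and parameters are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree


def icp(src, tgt, max_iters=50, tol=1e-6):
    """src: (N, 3), tgt: (M, 3); returns rotation R (3x3) and t (3,)."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(tgt)
    prev_err = np.inf
    for _ in range(max_iters):
        moved = src @ R.T + t
        dist, idx = tree.query(moved)             # closest points in tgt
        p, q = moved, tgt[idx]
        p_c, q_c = p.mean(axis=0), q.mean(axis=0)
        H = (p - p_c).T @ (q - q_c)               # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T                       # incremental rotation
        R, t = dR @ R, dR @ (t - p_c) + q_c       # compose transforms
        err = dist.mean()
        if abs(prev_err - err) < tol:             # converged
            break
        prev_err = err
    return R, t
```

Because each update only refines the current estimate locally, the algorithm converges to the nearest local minimum, which is why the a priori alignment mentioned above matters.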


On the Effectiveness of Feature-based Lidar Point Cloud Registration

LIDAR systems have been regarded as novel technologies for efficiently acquiring 3-D geospatial information, resulting in broad applications in engineering and management fields. Registration of LIDAR point clouds from consecutive scans or different platforms is a prerequisite for fully exploiting the advantages of the aforementioned applications. In this study, the authors integrate point, line, and pl...



Journal

Journal Title: IEEE Transactions on Robotics

Year: 2022

ISSN: 1552-3098, 1941-0468, 1546-1904

DOI: https://doi.org/10.1109/tro.2022.3150683